VCI-LSTM: Vector Choquet Integral-Based Long Short-Term Memory

Authors

Abstract

The Choquet integral is a widely used aggregation operator on one-dimensional and interval-valued information, since it is able to take into account possible interactions among the data. However, there are many cases where the information to aggregate is vectorial, such as the inputs of long short-term memory (LSTM) units. LSTM units are a kind of recurrent neural network that have become one of the most powerful tools for dealing with sequential data, since they can control the information flow. In this article, we first generalize the standard Choquet integral to admit an input composed of $n$-dimensional vectors, which produces an $n$-dimensional vector output. We study several properties and construction methods of these vector Choquet integrals (VCIs). Then, we use this integral in place of the summation operator, introducing in this way the new VCI-LSTM architecture. Finally, we use the proposed VCI-LSTM to deal with two problems: 1) image classification and 2) text classification.
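The scalar Choquet integral that the article generalizes can be sketched as follows. This is a minimal illustration, not the paper's construction; in particular, the cardinality-based fuzzy measure $\mu(A) = (|A|/n)^2$ is an assumption chosen only for the example:

```python
import numpy as np
from itertools import chain, combinations

def choquet_integral(x, mu):
    """Discrete Choquet integral of vector x w.r.t. a fuzzy measure mu.

    mu maps a frozenset of indices to [0, 1], with mu(empty set) = 0
    and mu(all indices) = 1, and is monotone w.r.t. set inclusion.
    """
    x = np.asarray(x, dtype=float)
    order = np.argsort(x)                 # indices in ascending value order
    total, prev = 0.0, 0.0
    for i in range(len(order)):
        A = frozenset(order[i:])          # indices whose value is >= x[order[i]]
        total += (x[order[i]] - prev) * mu[A]
        prev = x[order[i]]
    return total

# Illustrative cardinality-based measure: mu(A) = (|A|/n)^2 (an assumption)
n = 3
subsets = chain.from_iterable(combinations(range(n), r) for r in range(n + 1))
mu = {frozenset(s): (len(s) / n) ** 2 for s in subsets}

print(choquet_integral([0.2, 0.5, 0.9], mu))
```

Because $\mu$ here is non-additive, the result differs from any weighted mean of the inputs; this interaction-awareness is what the VCI carries over to vector-valued LSTM inputs.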


Related articles

Language Identification in Short Utterances Using Long Short-Term Memory (LSTM) Recurrent Neural Networks

Long Short Term Memory (LSTM) Recurrent Neural Networks (RNNs) have recently outperformed other state-of-the-art approaches, such as i-vector and Deep Neural Networks (DNNs), in automatic Language Identification (LID), particularly when dealing with very short utterances (∼3s). In this contribution we present an open-source, end-to-end, LSTM RNN system running on limited computational resources...


The Effects of Keyword and Context Methods on Pronunciation and Receptive/Productive Vocabulary of Low-Intermediate Iranian EFL Learners: Short-Term and Long-Term Memory in Focus

A large body of past research has acknowledged, in one way or another, the effectiveness of vocabulary learning strategies in foreign language learning. This study examines the effect of two different methods of teaching English vocabulary (keyword and context) on the pronunciation and lexical knowledge of low-intermediate Iranian EFL learners, and on its retention in memory. To this end, sixty Iranian language learners aged eight to fourteen...


Long Short-term Memory

Model compression is significant for the wide adoption of Recurrent Neural Networks (RNNs) in both user devices possessing limited resources and business clusters requiring quick responses to large-scale service requests. This work aims to learn structurally-sparse Long Short-Term Memory (LSTM) by reducing the sizes of basic structures within LSTM units, including input updates, gates, hidden s...


Long Short-Term Memory

Learning to store information over extended time intervals by recurrent backpropagation takes a very long time, mostly because of insufficient, decaying error backflow. We briefly review Hochreiter's (1991) analysis of this problem, then address it by introducing a novel, efficient, gradient-based method called long short-term memory (LSTM). Truncating the gradient where this does not do harm, ...


Anomaly Detection for Temporal Data using Long Short-Term Memory (LSTM)

We explore the use of Long short-term memory (LSTM) for anomaly detection in temporal data. Due to the challenges in obtaining labeled anomaly datasets, an unsupervised approach is employed. We train recurrent neural networks (RNNs) with LSTM units to learn the normal time series patterns and predict future values. The resulting prediction errors are modeled to give anomaly scores. We investiga...
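The prediction-error scoring scheme this abstract describes can be sketched independently of the predictor. Below, a hypothetical one-step moving-average forecaster stands in for the LSTM (an assumption made only to keep the sketch self-contained); errors are fitted with a Gaussian and scored by their negative log-likelihood, up to a constant:

```python
import numpy as np

def anomaly_scores(series, window=5):
    """Score each point by how unlikely its one-step prediction error is
    under a Gaussian fitted to all errors (Gaussian NLL up to a constant).

    NOTE: the moving-average predictor is a stand-in for the trained
    LSTM predictor used in the paper; only the scoring step is shown.
    """
    series = np.asarray(series, dtype=float)
    # pred[i] = mean of the previous `window` values, forecasting series[i + window]
    preds = np.convolve(series, np.ones(window) / window, mode="valid")[:-1]
    errors = series[window:] - preds
    mu_e, sigma_e = errors.mean(), errors.std() + 1e-12
    z = (errors - mu_e) / sigma_e
    return 0.5 * z ** 2

rng = np.random.default_rng(0)
x = np.sin(np.linspace(0, 20, 200)) + 0.05 * rng.standard_normal(200)
x[150] += 3.0                                   # injected point anomaly
scores = anomaly_scores(x)
print(int(np.argmax(scores)) + 5)               # series index of the top score
```

The score at error index i refers to series index i + window, which is why the offset is added back when reporting the most anomalous point.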



Journal

Journal title: IEEE Transactions on Fuzzy Systems

Year: 2023

ISSN: 1063-6706, 1941-0034

DOI: https://doi.org/10.1109/tfuzz.2022.3222035